Search results for "Gibbs sampling"

Showing 10 of 18 documents

Recycling Gibbs sampling

2017

Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics. The key to the successful application of the Gibbs sampler is the ability to draw samples efficiently from the full-conditional probability density functions. Since this is not possible in the general case, auxiliary samples must be generated to speed up the convergence of the chain. However, this intermediate information is ultimately discarded. In this work, we show that these auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency at no extra cost. Theoretical and exhaustive numerical co…
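
The full-conditional sampling loop that the abstract refers to can be sketched for the simplest textbook case, a zero-mean bivariate normal, where both conditionals are known univariate normals. This is a generic illustration of Gibbs sampling, not the paper's recycling scheme:

```python
import random

def gibbs_bivariate_normal(rho, n_iters, seed=0):
    """Gibbs sampler for a zero-mean bivariate normal with correlation rho.

    Each full conditional is a univariate normal:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)
    """
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iters):
        x = rng.gauss(rho * y, sd)  # draw from p(x | y)
        y = rng.gauss(rho * x, sd)  # draw from p(y | x)
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_iters=20000)
mean_x = sum(s[0] for s in samples) / len(samples)
```

The paper's point is that when these conditionals cannot be sampled directly, the auxiliary draws produced by inner samplers (e.g. rejection or slice steps) can be kept and averaged into the estimators rather than thrown away.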

Keywords: Gibbs sampling; Markov chain Monte Carlo; Monte Carlo method; Slice sampling; Hybrid Monte Carlo; Rejection sampling; Markov process; Probability density function; Estimator; Machine learning; Signal and image processing.
Published in: 2017 25th European Signal Processing Conference (EUSIPCO).
researchProduct

Efficient anomaly detection on sampled data streams with contaminated phase I data

2020

Control chart algorithms aim to monitor a process over time. This monitoring consists of two phases. Phase I, also called the learning phase, estimates the normal process parameters; in Phase II, anomalies are detected. However, the learning phase itself can contain contaminated data, such as outliers. If left undetected, these outliers can jeopardize the accuracy of the whole chart by distorting the computed parameters, leading to faulty classifications and defective data analysis results. The problem becomes more severe when the analysis is done on a sample of the data rather than the whole data. To avoid such a situation, Phase I quality must be guaranteed. The purpose…
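
The two-phase monitoring idea can be sketched with a plain EWMA chart: Phase I supplies the in-control mean and standard deviation, and Phase II flags observations whose smoothed statistic leaves the control limits. The data and parameter values here are hypothetical, and this is only the standard EWMA recursion, not the paper's sampling-aware method:

```python
def ewma_chart(data, mu0, sigma0, lam=0.2, L=3.0):
    """Flag time indices where the EWMA statistic leaves its control limits.

    mu0, sigma0: in-control mean/standard deviation estimated in Phase I.
    lam: smoothing weight; L: control-limit width in EWMA standard deviations.
    """
    z = mu0
    alarms = []
    # Steady-state standard deviation of the EWMA statistic
    sd_z = sigma0 * (lam / (2.0 - lam)) ** 0.5
    for t, x in enumerate(data):
        z = lam * x + (1.0 - lam) * z
        if abs(z - mu0) > L * sd_z:
            alarms.append(t)
    return alarms

# In-control stream followed by a mean shift (hypothetical data)
stream = [0.0] * 20 + [2.0] * 10
alarms = ewma_chart(stream, mu0=0.0, sigma0=1.0)
```

If Phase I estimates of mu0 and sigma0 are inflated by undetected outliers, the limits widen and shifts like the one above are detected later or not at all, which is the failure mode the paper addresses.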

Keywords: Gibbs sampling; Control chart; EWMA chart; Data stream mining; Anomaly detection; Outlier; Pattern recognition.

Adaptive independent sticky MCMC algorithms

2018

In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky MCMC algorithms, for efficient sampling from a generic target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities which become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points which is constructed iteratively based on previously drawn samples. The algorithm's efficiency is ensured by a test that controls the evolution of the set of support points. This extra stage controls the computational cost and the converge…
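
The core construction, a proposal built by interpolation from target evaluations at a set of support points, can be sketched with a piecewise-constant density and inverse-CDF sampling. This is a simplified illustration of the idea, not the adaptive sticky algorithm itself, and the Gaussian-like target is hypothetical:

```python
import bisect
import math
import random

def build_piecewise_proposal(points, target):
    """Piecewise-constant proposal from target evaluations at sorted support points."""
    xs = sorted(points)
    heights = [target(0.5 * (a + b)) for a, b in zip(xs, xs[1:])]
    widths = [b - a for a, b in zip(xs, xs[1:])]
    areas = [h * w for h, w in zip(heights, widths)]
    total = sum(areas)
    cdf, acc = [], 0.0
    for a in areas:
        acc += a
        cdf.append(acc / total)

    def sample(rng):
        # Inverse-CDF: pick an interval, then a uniform point inside it
        i = bisect.bisect_left(cdf, rng.random())
        return xs[i] + rng.random() * widths[i]

    def pdf(x):
        # Needed for Metropolis-Hastings ratios with an independent proposal
        if x < xs[0] or x >= xs[-1]:
            return 0.0
        i = bisect.bisect_right(xs, x) - 1
        return heights[i] / total

    return sample, pdf

target = lambda x: math.exp(-x * x)  # unnormalized Gaussian-like target
sample, pdf = build_piecewise_proposal([-3, -1, 0, 1, 3], target)
rng = random.Random(1)
draws = [sample(rng) for _ in range(1000)]
```

In the sticky scheme, rejected or poorly fitted points are added back into the support set, so the proposal converges toward the target as the chain runs.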

Keywords: Gibbs sampling; Adaptive Markov chain Monte Carlo (MCMC); Adaptive rejection Metropolis sampling (ARMS); Metropolis-within-Gibbs; Hit and run algorithm; Bayesian inference; Computational statistics; Gaussian process; Signal processing.
Published in: EURASIP Journal on Advances in Signal Processing.

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics to draw samples from complicated high-dimensional posterior distributions. The key to the successful application of the Gibbs sampler is the ability to efficiently draw samples from the full-conditional probability density functions. Since this is not possible in the general case, auxiliary samples must be generated to speed up the convergence of the chain, and their information is eventually disregarded. In this work, we show that these auxiliary sample…

Keywords: Gibbs sampling; Markov chain Monte Carlo; Monte Carlo method; Slice sampling; Bayesian inference; Gaussian process; Chain rule (probability); Machine learning.
Published in: Digital Signal Processing.

Grapham: Graphical models with adaptive random walk Metropolis algorithms

2008

Recently developed adaptive Markov chain Monte Carlo (MCMC) methods have been applied successfully to many problems in Bayesian statistics. Grapham is a new open-source implementation covering several such methods, with emphasis on graphical models for directed acyclic graphs. The implemented algorithms include the seminal Adaptive Metropolis algorithm, which adjusts the proposal covariance according to the history of the chain, and a Metropolis algorithm that adjusts the proposal scale based on the observed acceptance probability. Different variants of the algorithms allow one, for example, to use these two algorithms together, employ delayed rejection, and adjust several parameters of the algorithm…
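
The second algorithm mentioned, scale adaptation driven by the observed acceptance rate, can be sketched in a few lines. This is a generic acceptance-rate-tuned random-walk Metropolis, not Grapham's implementation; the 0.44 target and the step-size schedule are common choices, assumed here for illustration:

```python
import math
import random

def adaptive_scale_metropolis(log_target, x0, n_iters, target_acc=0.44, seed=0):
    """Random-walk Metropolis whose proposal scale adapts toward a desired
    acceptance rate via diminishing Robbins-Monro updates."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    log_scale = 0.0
    chain, n_acc = [], 0
    for t in range(1, n_iters + 1):
        prop = x + math.exp(log_scale) * rng.gauss(0.0, 1.0)
        lp_prop = log_target(prop)
        accepted = rng.random() < math.exp(min(0.0, lp_prop - lp))
        if accepted:
            x, lp = prop, lp_prop
            n_acc += 1
        # Grow the scale when accepting too often, shrink it otherwise;
        # the t**-0.6 step size vanishes, so adaptation dies out over time.
        log_scale += t ** -0.6 * ((1.0 if accepted else 0.0) - target_acc)
        chain.append(x)
    return chain, n_acc / n_iters

# Standard normal target (log density up to a constant)
chain, acc_rate = adaptive_scale_metropolis(lambda x: -0.5 * x * x, 0.0, 20000)
```

The diminishing step size is what keeps the adaptive chain ergodic: the proposal eventually freezes, so the usual MCMC convergence guarantees apply in the limit.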

Keywords: Gibbs sampling; Adaptive algorithm; Markov chain; Markov chain Monte Carlo; Metropolis–Hastings algorithm; Multiple-try Metropolis; Rejection sampling; Graphical model.
Published in: Computational Statistics & Data Analysis.

Multivariate exponential smoothing: A Bayesian forecast approach based on simulation

2009

This paper deals with the prediction of time series with correlated errors at each time point using a Bayesian forecast approach based on the multivariate Holt-Winters model. Assuming that each of the univariate time series comes from the univariate Holt-Winters model, all of them sharing a common structure, the multivariate Holt-Winters model can be formulated as a traditional multivariate regression model. This formulation facilitates obtaining the posterior distribution of the model parameters, which is not analytically tractable: simulation is needed. An acceptance sampling procedure is used in order to obtain a sample from this posterior distribution. Using Monte Carlo integration the …
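
The univariate building block that the paper generalizes, Holt-Winters-style exponential smoothing, can be sketched with the level-plus-trend (Holt) recursion; the seasonal component and the Bayesian simulation layer are omitted, and the series and smoothing weights below are hypothetical:

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
    """Holt's linear (level + trend) exponential smoothing; returns
    point forecasts for `horizon` steps ahead."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        # Update the level toward the new observation, then the trend
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

forecasts = holt_forecast([10, 12, 14, 16, 18])
```

In the paper's setting, several such univariate models share a common structure, which lets the multivariate system be rewritten as a multivariate regression whose posterior is then explored by simulation.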

Keywords: Gibbs sampling; Multivariate statistics; Markov chain Monte Carlo; Normal-Wishart distribution; Multivariate t-distribution; Bayesian linear regression; Univariate distribution.
Published in: Mathematics and Computers in Simulation.

Poisson Regression with Change-Point Prior in the Modelling of Disease Risk around a Point Source

2003

Bayesian estimation of the risk of a disease around a known point source of exposure is considered. The minimal requirements for data are that cases and populations at risk are known for a fixed set of concentric annuli around the point source, and each annulus has a uniquely defined distance from the source. The conventional Poisson likelihood is assumed for the counts of disease cases in each annular zone, with zone-specific relative risk parameters, and, conditional on the risks, the counts are considered independent. The prior for the relative risk parameters is assumed to be piecewise constant in distance, with a known number of components. This prior is the well-known cha…
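
The likelihood described above can be sketched for the simplest case of a single change point: zones inside the change share one relative risk, zones outside share another. The counts, expected counts, and risk values below are hypothetical, purely to show how the zone-wise Poisson terms combine:

```python
import math

def poisson_loglik(cases, expected, risks, change_zone):
    """Log-likelihood for annular zone counts with a piecewise-constant
    relative risk: zones before `change_zone` share risks[0], the rest risks[1]."""
    ll = 0.0
    for i, (y, e) in enumerate(zip(cases, expected)):
        theta = risks[0] if i < change_zone else risks[1]
        mu = theta * e  # Poisson mean = relative risk times expected count
        ll += y * math.log(mu) - mu - math.lgamma(y + 1)
    return ll

# Hypothetical counts and expected counts in 6 annular zones around a source
cases = [12, 9, 7, 5, 5, 4]
expected = [5.0, 5.0, 5.0, 5.0, 5.0, 5.0]
ll_change = poisson_loglik(cases, expected, risks=(2.0, 1.0), change_zone=2)
ll_flat = poisson_loglik(cases, expected, risks=(1.0, 1.0), change_zone=2)
```

With elevated counts near the source, the change-point model fits better than a flat risk, which is the kind of comparison the posterior sampling in the paper automates over risk levels and change locations.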

Keywords: Gibbs sampling; Bayes estimator; Point source; Posterior probability; Conditional probability distribution; Poisson distribution; Poisson regression; Prior probability.
Published in: Biometrical Journal.

Statistical inference and Monte Carlo algorithms

1996

This review article looks at a small part of the picture of the interrelationship between statistical theory and computational algorithms, especially the Gibbs sampler and the Accept-Reject algorithm. We pay particular attention to how the methodologies affect and complement each other.

Keywords: Gibbs sampling; Monte Carlo method; Markov chain Monte Carlo; Statistical inference; Statistical theory; Decision theory.
Published in: Test.

Establishing some order amongst exact approximations of MCMCs

2016

Exact approximations of Markov chain Monte Carlo (MCMC) algorithms are a general emerging class of sampling algorithms. One of the main ideas behind exact approximations is to replace intractable quantities required to run standard MCMC algorithms, such as the target probability density in a Metropolis-Hastings algorithm, with estimators. Perhaps surprisingly, such approximations lead to powerful algorithms which are exact in the sense that they are guaranteed to have the correct limiting distributions. In this paper we develop a general framework which allows one to compare, or order, performance measures of two implementations of such algorithms. In particular, we establish an order …
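
The key mechanism, substituting an unbiased non-negative estimate of the target density into the Metropolis-Hastings ratio while reusing the estimate at the current state, can be sketched as a pseudo-marginal sampler. The Exp(1)-weight noise model is a hypothetical stand-in for a genuine intractable-likelihood estimator:

```python
import math
import random

def pseudo_marginal_mh(est_density, x0, n_iters, prop_sd=1.0, seed=0):
    """Metropolis-Hastings where the target density is replaced by a
    non-negative unbiased estimate; the estimate at the current state is
    reused between iterations, which is what keeps the chain exact."""
    rng = random.Random(seed)
    x, dx = x0, est_density(x0, rng)
    chain = []
    for _ in range(n_iters):
        prop = x + rng.gauss(0.0, prop_sd)
        dprop = est_density(prop, rng)
        # Accept with probability min(1, dprop / dx)
        if rng.random() * dx < dprop:
            x, dx = prop, dprop
        chain.append(x)
    return chain

def noisy_gaussian_density(x, rng):
    """Unbiased noisy estimate of an unnormalized N(0,1) density:
    the Exp(1) weight has mean one, so the estimate is unbiased."""
    return rng.expovariate(1.0) * math.exp(-0.5 * x * x)

chain = pseudo_marginal_mh(noisy_gaussian_density, 0.0, 20000)
mean = sum(chain) / len(chain)
```

Noisier estimators make the chain stickier (a lucky high estimate at the current state suppresses acceptances), which is exactly the kind of performance degradation the paper's ordering framework quantifies via the convex order on estimator distributions.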

Keywords: Gibbs sampling; Pseudo-marginal algorithm; Martingale coupling; Convex order; Asymptotic variance; Markov chain Monte Carlo; Estimator; MSC: 60E15, 60J05, 60J22, 65C05, 65C40.

MCMC methods to approximate conditional predictive distributions

2006

Sampling from conditional distributions is a problem often encountered in statistics when inferences are based on conditional distributions which are not of closed-form. Several Markov chain Monte Carlo (MCMC) algorithms to simulate from them are proposed. Potential problems are pointed out and some suitable modifications are suggested. Approximations based on conditioning sets are also explored. The issues are illustrated within a specific statistical tool for Bayesian model checking, and compared in an example. An example in frequentist conditional testing is also given.

Keywords: Gibbs sampling; Markov chain Monte Carlo; Metropolis–Hastings algorithm; Conditional probability distribution; Bayesian inference; Sampling distribution; Frequentist inference.
Published in: Computational Statistics & Data Analysis.